# Denoising Pre-training
## Bart Base

- Author: facebook
- License: Apache-2.0
- Tags: Large Language Model, English
- Downloads: 2.1M · Likes: 183

BART is a Transformer model combining a bidirectional encoder and an autoregressive decoder, suitable for text generation and understanding tasks.
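A minimal sketch of how this checkpoint might be used for denoising-style mask filling with the Hugging Face transformers library. The hub id `facebook/bart-base` is inferred from the author and model name above, and the example text is illustrative.

```python
from transformers import BartTokenizer, BartForConditionalGeneration

# Hub id assumed from the "facebook" / "Bart Base" listing above.
tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# BART was pre-trained to reconstruct corrupted text, so it can fill <mask> spans.
text = "The quick brown <mask> jumps over the lazy dog."
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=32, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```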
## DialogLED Base 16384

- Author: MingZhong
- Tags: Large Language Model, Transformers, Other
- Downloads: 566 · Likes: 6

DialogLM is a pre-trained model based on the Longformer-Encoder-Decoder (LED) architecture, specifically designed for long dialogue understanding and summarization tasks.
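A minimal sketch of long-dialogue summarization with this checkpoint, assuming it loads through the standard seq2seq auto classes. The hub id `MingZhong/DialogLED-base-16384`, the speaker-tag format, and the sample dialogue are assumptions for illustration, not taken from this page.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hub id assumed from the "MingZhong" / "DialogLED Base 16384" listing above.
tokenizer = AutoTokenizer.from_pretrained("MingZhong/DialogLED-base-16384")
model = AutoModelForSeq2SeqLM.from_pretrained("MingZhong/DialogLED-base-16384")

# Illustrative dialogue; the LED backbone supports inputs up to 16384 tokens.
dialogue = (
    "#Person1#: The quarterly numbers look better than expected. "
    "#Person2#: Agreed, but we still need to cut cloud costs next quarter."
)
inputs = tokenizer(dialogue, return_tensors="pt", truncation=True, max_length=16384)
summary_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```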